# Korean Text Generation
## Mamba-ko-2.8B
- **License:** Apache-2.0
- **Author:** kuotient
- **Tags:** Large Language Model, Transformers, Korean
- **Downloads:** 24 · **Likes:** 18

Mamba-ko-2.8B is a Korean pre-trained model based on state space models, trained on the synthetically generated korean_textbooks dataset.
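Mamba belongs to the state-space-model family: each step updates a hidden state from the previous state and the current input, then reads the output from that state. A minimal scalar-state sketch of the discretized recurrence h_t = a·h_{t-1} + b·x_t, y_t = c·h_t (toy coefficients, not Mamba's selective parameterization):

```python
def ssm_scan(xs, a=0.9, b=1.0, c=1.0):
    """Scalar discretized state-space model:
    h_t = a*h_{t-1} + b*x_t,  y_t = c*h_t."""
    h, ys = 0.0, []
    for x in xs:
        h = a * h + b * x   # state update
        ys.append(c * h)    # readout
    return ys

# An impulse input decays geometrically with rate a:
# ssm_scan([1, 0, 0], a=0.5) -> [1.0, 0.5, 0.25]
```

The same recurrence can be computed as a parallel scan, which is what makes SSMs efficient to train on long sequences.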
## T5 V1.1 Base Ko Chat
- **License:** Apache-2.0
- **Author:** sangmin6600
- **Tags:** Text Generation, Transformers, Korean
- **Downloads:** 1,773 · **Likes:** 3

A Korean text generation model based on T5 v1.1-base, specializing in text-to-text generation tasks.
## Llama-2-Ko-7B
- **Author:** beomi
- **Tags:** Large Language Model, Transformers, Multilingual
- **Downloads:** 3,451 · **Likes:** 175

Llama-2-Ko is an advanced iteration of Llama 2, optimized for Korean text generation through vocabulary expansion and additional pre-training on a Korean corpus.
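Vocabulary expansion, as used for Llama-2-Ko, appends language-specific tokens to an existing tokenizer vocabulary and grows the embedding table to match the new size. A toy sketch in pure Python (the token list, sizes, and zero-initialization here are illustrative, not the actual Llama-2-Ko recipe):

```python
def expand_vocab(vocab, new_tokens):
    """Append unseen tokens to a token->id vocab; return the ids assigned."""
    added = []
    for tok in new_tokens:
        if tok not in vocab:
            vocab[tok] = len(vocab)   # next free id
            added.append(vocab[tok])
    return added

def grow_embeddings(emb, new_rows, dim, init=0.0):
    """Extend an embedding matrix (list of rows) for the new tokens."""
    emb.extend([[init] * dim for _ in range(new_rows)])
    return emb
```

In practice the new rows are initialized (e.g. from the mean of existing embeddings) and then trained during the additional Korean pre-training stage.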
## KoAlpaca-llama-1-7b
- **License:** Apache-2.0
- **Author:** beomi
- **Tags:** Large Language Model, Transformers, Multilingual
- **Downloads:** 213 · **Likes:** 28

KoAlpaca is the Korean version of the Stanford Alpaca model, combining the LLaMA architecture with Polyglot-ko, optimized for Korean text generation tasks.
## KoBART-base-v2
- **License:** MIT
- **Author:** gogamza
- **Tags:** Large Language Model, Transformers, Korean
- **Downloads:** 5,937 · **Likes:** 34

KoBART is a Korean encoder-decoder language model based on the BART architecture, trained with a text-infilling noising objective, and supports Korean feature extraction and text generation tasks.
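BART-style text infilling, KoBART's pre-training objective, corrupts the input by replacing contiguous token spans with a single mask token; the model learns to reconstruct the original text. A minimal sketch in pure Python (the per-position masking probability and span-length sampling are illustrative, not the exact KoBART recipe):

```python
import random

def text_infill(tokens, mask_token="<mask>", mask_ratio=0.3, mean_span=3, rng=None):
    """Replace contiguous token spans with one mask token (BART text infilling)."""
    rng = rng or random.Random(0)
    out, i, n = [], 0, len(tokens)
    budget = int(n * mask_ratio)  # total tokens to corrupt
    while i < n:
        if budget > 0 and rng.random() < mask_ratio:
            # sample a span length, capped by the remaining budget and sequence
            span = min(max(1, int(rng.expovariate(1.0 / mean_span))), budget, n - i)
            out.append(mask_token)  # the whole span collapses to one mask token
            i += span
            budget -= span
        else:
            out.append(tokens[i])
            i += 1
    return out
```

Because a span of several tokens becomes one mask token, the corrupted sequence is usually shorter than the original, so the decoder must also infer how many tokens each mask hides.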
## KoBART-base-v1
- **License:** MIT
- **Author:** gogamza
- **Tags:** Large Language Model, Transformers, Korean
- **Downloads:** 2,077 · **Likes:** 1

KoBART is a Korean pretrained model based on the BART architecture, suitable for various Korean natural language processing tasks.
## KoGPT2-base-v2
- **Author:** skt
- **Tags:** Large Language Model, Korean
- **Downloads:** 105.25k · **Likes:** 47

KoGPT2 is a Korean GPT-2 model developed by SKT-AI, based on the Transformer architecture and suitable for a wide range of Korean text generation tasks.
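Decoder-only models such as KoGPT2 generate text autoregressively: each step scores candidate next tokens given the sequence so far and appends one of them. A toy greedy-decoding loop with a hypothetical stand-in scorer (the bigram table below is a placeholder, not KoGPT2):

```python
def greedy_decode(next_scores, prompt, max_new=5, eos="<eos>"):
    """Greedy autoregressive decoding: repeatedly append the argmax next token."""
    seq = list(prompt)
    for _ in range(max_new):
        scores = next_scores(seq)          # token -> score for the next position
        tok = max(scores, key=scores.get)  # greedy choice
        if tok == eos:
            break
        seq.append(tok)
    return seq

# Hypothetical deterministic "model": a bigram continuation table.
table = {"나는": "밥을", "밥을": "먹었다", "먹었다": "<eos>"}

def toy_scores(seq):
    nxt = table.get(seq[-1], "<eos>")
    return {nxt: 1.0, "<eos>": 0.0 if nxt != "<eos>" else 1.0}

# greedy_decode(toy_scores, ["나는"]) -> ["나는", "밥을", "먹었다"]
```

Real decoders swap the argmax for sampling strategies (temperature, top-k, top-p) to trade determinism for diversity.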